
    A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline

    The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment, which must adhere strictly to the project schedule in order to be ready for launch and flight operations. To guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the Verification and Validation of the software. The purpose of this work is to show an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, detailing the methods used and the results obtained. Different approaches were used to test the scientific and the housekeeping data processing. Scientific data processing was tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and to reproduce nominal conditions as closely as possible. For the housekeeping telemetry processing, validation software was developed to inject known parameter values into a set of real housekeeping packets and to compare them with the corresponding timelines generated by the Level 1. With the proposed validation and verification procedure, in which the on-board and ground processing are viewed as a single pipeline, we demonstrate that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements.
    Comment: 20 pages, 7 figures; this paper is part of the prelaunch status LFI papers published in JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/jins
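    As an illustration of the housekeeping validation approach described above, the hedged Python sketch below injects known raw values into a set of packets, runs a decoding step over them, and compares the output with the expected calibrated timeline. All names here (the packet layout, decode_timeline, the parameter LM041_TEMP) are hypothetical stand-ins, not the actual Level 1 code.

```python
import random

def validate_hk_processing(packets, field, decode_timeline, calibrate, tol=1e-9):
    """Inject known raw values into `field` of each packet, run the
    ground processing, and compare with the expected calibrated values."""
    injected = [random.randint(0, 2**16 - 1) for _ in packets]
    for pkt, raw in zip(packets, injected):
        pkt[field] = raw                        # known value into the HK packet
    timeline = decode_timeline(packets, field)  # ground-segment processing step
    expected = [calibrate(raw) for raw in injected]
    errors = [(got, exp) for got, exp in zip(timeline, expected)
              if abs(got - exp) > tol]
    return not errors, errors

# Toy demonstration with a linear calibration curve standing in for the
# real HK conversion: the test passes when decoding matches calibration.
cal = lambda raw: 0.01 * raw - 5.0
ok, _ = validate_hk_processing(
    packets=[{} for _ in range(100)],
    field="LM041_TEMP",                         # hypothetical HK parameter name
    decode_timeline=lambda pkts, f: [cal(p[f]) for p in pkts],
    calibrate=cal,
)
print("HK processing validated:", ok)
```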

    Optimization of Planck/LFI on-board data handling

    To assess stability against 1/f noise, the Low Frequency Instrument (LFI) on board the Planck mission will acquire data at a rate much higher than the rate allowed by its telemetry bandwidth of 35.5 kbps. The data are processed by an on-board pipeline, followed on ground by a reversing step. This paper illustrates the LFI scientific on-board processing used to fit the allowed data rate. This is a lossy process tuned by a set of five parameters (Naver, r1, r2, q, O) for each of the 44 LFI detectors. The paper quantifies the level of distortion introduced by the on-board processing, EpsilonQ, as a function of these parameters, and describes the method used to optimize the on-board processing chain. The tuning procedure is based on an optimization algorithm applied to unprocessed and uncompressed raw data provided either by simulations, by prelaunch tests, or by the LFI operating in diagnostic mode. All the required optimization steps are performed by an automated tool, OCA2, which delivers the optimized parameters and produces a set of statistical indicators, among them the compression rate Cr and EpsilonQ. For Planck/LFI the requirements are Cr = 2.4 and EpsilonQ <= 10% of the rms of the instrumental white noise. To speed up the process, an analytical model was developed that extracts most of the relevant information on EpsilonQ and Cr as a function of the signal statistics and the processing parameters; this model will also be of interest for the instrument data analysis. The method was applied during ground tests while the instrument was operating in conditions representative of flight. Optimized parameters were obtained and the performance verified: the required data rate of 35.5 kbps was achieved while keeping EpsilonQ at 3.8% of the white-noise rms, well within the requirements.
    Comment: 51 pages, 13 figures, 3 tables, pdflatex, needs JINST.csl, graphicx, txfonts, rotating; Issue 1.0, 10 Nov 2009; submitted to JINST 23 Jun 2009, accepted 10 Nov 2009, published 29 Dec 2009; this is a preprint, not the final version
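    As a worked illustration of the chain sketched in the abstract, the following snippet mimics the five-parameter pipeline (averaging over Naver samples, mixing with r1 and r2, requantization with step q and offset O), reverses it on "ground", and estimates EpsilonQ and Cr on simulated white noise. The mixing and rounding conventions, and the use of zlib as a stand-in for the on-board lossless coder, are assumptions made for illustration, not the flight implementation.

```python
import zlib
import numpy as np

def onboard_chain(sky, ref, naver, r1, r2, q, offset):
    # 1) average Naver consecutive samples of each data stream
    n = (len(sky) // naver) * naver
    sky_a = sky[:n].reshape(-1, naver).mean(axis=1)
    ref_a = ref[:n].reshape(-1, naver).mean(axis=1)
    # 2) mix sky and reference samples with the two weights r1 and r2
    p1 = sky_a - r1 * ref_a
    p2 = sky_a - r2 * ref_a
    # 3) requantize with step q and offset O (the lossy step)
    q1 = np.round((p1 + offset) / q).astype(np.int32)
    q2 = np.round((p2 + offset) / q).astype(np.int32)
    return sky_a, q1, q2

def ground_reverse(q1, q2, r1, r2, q, offset):
    # undo the quantization, then demix back to sky and reference
    p1 = q1 * q - offset
    p2 = q2 * q - offset
    ref = (p1 - p2) / (r2 - r1)
    sky = p1 + r1 * ref
    return sky, ref

rng = np.random.default_rng(0)
sky = rng.normal(0.0, 1.0, 2**16)   # white noise with rms = 1
ref = rng.normal(0.0, 1.0, 2**16)
naver, r1, r2, q, O = 3, 1.0, 0.5, 0.1, 0.0

sky_a, q1, q2 = onboard_chain(sky, ref, naver, r1, r2, q, O)
sky_r, _ = ground_reverse(q1, q2, r1, r2, q, O)

eps_q = 100.0 * (sky_r - sky_a).std() / sky.std()   # distortion, % of white-noise rms
raw = np.stack([q1, q2]).astype(np.int16).tobytes()
cr = len(raw) / len(zlib.compress(raw))             # achieved compression ratio
print(f"EpsilonQ = {eps_q:.1f}%  Cr = {cr:.2f}")
```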

    Off-line radiometric analysis of Planck/LFI data

    The Planck Low Frequency Instrument (LFI) is an array of 22 pseudo-correlation radiometers on board the Planck satellite, designed to measure temperature and polarization anisotropies of the Cosmic Microwave Background (CMB) in three frequency bands (30, 44 and 70 GHz). To calibrate and verify the performance of the LFI, a software suite named LIFE has been developed. Its aim is to provide a common platform for analyzing the results of the tests performed on the individual components of the instrument (the Radiometric Chain Assemblies, RCAs) and on the integrated Radiometric Array Assembly (RAA). Moreover, its analysis tools are designed to be used during flight as well, to produce periodic reports on the status of the instrument. The LIFE suite has been developed using a multi-layered, cross-platform approach. It implements a number of analysis modules written in RSI IDL, each accessing the data through a portable and heavily optimized library of functions written in C and C++. One of the most important features of LIFE is its ability to run the same data analysis codes on both ground test data and real flight data, as illustrated in the sketch below. The LIFE software suite was used successfully during the RCA/RAA tests and the Planck Integrated System Tests. Moreover, the software passed the verification for in-flight use during the System Operations Verification Tests held in October 2008.
    Comment: Planck LFI technical papers published by JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/1748-022
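    A minimal sketch of that design point, written in Python rather than the suite's actual IDL/C++ stack: a thin data-source abstraction lets the identical analysis function run on ground-test archives and on flight timelines. All class and function names here are hypothetical, not LIFE code.

```python
from abc import ABC, abstractmethod
import numpy as np

class DataSource(ABC):
    """Common interface hiding where a timeline comes from."""
    @abstractmethod
    def load_timeline(self, channel: str) -> np.ndarray: ...

class GroundTestSource(DataSource):
    """Stands in for reading an RCA/RAA ground-test archive."""
    def __init__(self, archive: dict):
        self.archive = archive
    def load_timeline(self, channel):
        return np.asarray(self.archive[channel])

class FlightSource(DataSource):
    """Stands in for querying Level 1 flight timelines."""
    def __init__(self, query_fn):
        self.query_fn = query_fn
    def load_timeline(self, channel):
        return np.asarray(self.query_fn(channel))

def white_noise_rms(source: DataSource, channel: str) -> float:
    # Identical analysis code, whatever the origin of the data:
    # estimate white noise from the rms of sample-to-sample differences.
    data = source.load_timeline(channel)
    return float(np.diff(data).std() / np.sqrt(2))
```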

    FACT - The G-APD revolution in Cherenkov astronomy

    For two years, the FACT telescope has been operating on the Canary Island of La Palma. Apart from its purpose of serving as a monitoring facility for the brightest TeV blazars, it was built as a major step towards establishing solid-state photon counters as detectors in Cherenkov astronomy. The camera of the First G-APD Cherenkov Telescope comprises 1440 Geiger-mode avalanche photodiodes (G-APDs), equipped with solid light guides to increase the effective light collection area of each sensor. Since no sense line is available, a special challenge is to keep the applied voltage stable, because the current drawn by the G-APDs depends on the flux of night-sky background photons, which varies significantly with ambient light conditions. Methods have been developed to keep the temperature- and voltage-dependent response of the G-APDs stable during operation; a sketch of the voltage-correction idea follows below. As a cross-check, dark count spectra with high statistics have been taken under different environmental conditions. This contribution presents the project, the methods developed, and the experience gained from two years of operating the first G-APD-based camera in Cherenkov astronomy under changing environmental conditions.
    Comment: Proceedings of the Nuclear Science Symposium and Medical Imaging Conference (IEEE-NSS/MIC), 201
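    A hedged sketch of that correction: without a sense line, the voltage at the sensor is the set voltage minus the drop over the series resistors, so the set point is corrected for the measured current and for the linear temperature dependence of the breakdown voltage. The resistor value and temperature coefficient below are illustrative assumptions, not FACT calibration constants.

```python
def corrected_bias_voltage(v_nominal: float,
                           current_a: float,
                           temperature_c: float,
                           r_series_ohm: float = 1000.0,  # assumed series/filter resistance
                           dv_dt: float = 0.055,          # assumed breakdown-voltage slope [V/K]
                           t_ref_c: float = 25.0) -> float:
    """Set-point voltage keeping the G-APD over-voltage constant."""
    v_drop = current_a * r_series_ohm            # ohmic drop grows with the NSB current
    v_temp = dv_dt * (temperature_c - t_ref_c)   # breakdown voltage shifts with temperature
    return v_nominal + v_drop + v_temp

# Example: bright ambient light raises the current, so the set voltage follows.
print(corrected_bias_voltage(v_nominal=71.0, current_a=0.5e-3, temperature_c=10.0))
```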

    FACT - The First G-APD Cherenkov Telescope: Status and Results

    The First G-APD Cherenkov Telescope (FACT) is the first telescope using silicon photon detectors (G-APDs, also known as SiPMs). It is built on the mount of the HEGRA CT3 telescope, still located at the Observatorio del Roque de los Muchachos, and has been operating successfully since October 2011. The use of silicon devices promises a higher photon detection efficiency, more robustness and higher precision than photomultiplier tubes. The FACT collaboration is investigating the precision with which these devices can be operated over the long term. Currently, the telescope is operated remotely, and robotic operation is under development. During the past months of operation, the foreseen monitoring program of the brightest known TeV blazars has been carried out, and first physics results have been obtained, including a strong flare of Mrk 501. An instantaneous flare alert system is already in a testing phase. This presentation gives an overview of the project and summarizes its goals, status and first results.

    Design and Operation of FACT - The First G-APD Cherenkov Telescope

    The First G-APD Cherenkov Telescope (FACT) is designed to detect cosmic gamma-rays with energies from several hundred GeV up to about 10 TeV using the Imaging Atmospheric Cherenkov Technique. In contrast to former or existing telescopes, the camera of the FACT telescope comprises solid-state Geiger-mode avalanche photodiodes (G-APDs) instead of photomultiplier tubes for photodetection. It is the first full-scale device of its kind to employ this new technology. The telescope has been operated at the Observatorio del Roque de los Muchachos (La Palma, Canary Islands, Spain) since fall 2011. This paper describes in detail the design, construction and operation of the system, including hardware and software aspects. Technical experience gained after one year of operation is discussed and conclusions with regard to future projects are drawn.
    Comment: Corresponding authors: T. Bretz and Q. Weitzel

    Planck early results V: The Low Frequency Instrument data processing

    Peer reviewed

    The INTEGRAL archive


    Online data analysis system of the INTEGRAL telescope

    Context. During more than 17 years of operation in space, the INTErnational Gamma-Ray Astrophysics Laboratory (INTEGRAL) telescope has accumulated a large data set that contains records of hard X-ray and soft γ-ray astronomical sources. These data can be reused in the context of multi-wavelength or multi-messenger studies of astronomical sources and have to be preserved on long timescales.
    Aims. We present a scientific validation of an interactive online INTEGRAL data analysis system for multi-wavelength studies of hard X-ray and soft γ-ray sources.
    Methods. The online data analysis system generates publication-quality high-level data products (sky images, spectra, and light curves) in response to user queries that define analysis parameters such as source position, time and energy interval, and binning. The data products can be requested via a web browser interface or via an application programming interface that is available as a Python package. The products for the Imager on Board the INTEGRAL Satellite / INTEGRAL Soft Gamma-Ray Imager instrument of INTEGRAL are generated using the offline science analysis (OSA) software, which is provided by the instrument teams and is conventionally used to analyse INTEGRAL data. The analysis workflow is organised to preserve and reuse various intermediate analysis products, ensuring that frequently requested results are available without delay. The platform is implemented in a Docker cluster, which allows the software to run in a controlled virtual environment and to be deployed in any compatible infrastructure. The scientific results produced by the open data analysis (ODA) are identical to those produced by OSA, because ODA simply provides a platform to retrieve the OSA results online while leveraging a provenance-indexed database of precomputed (cached) results to optimise their reuse.
    Results. We report the functionalities and performance of the online data analysis system by reproducing benchmark INTEGRAL results for different types of sources, including bright steady and transient Galactic sources, and bright and weak variable extragalactic sources. We compare the results obtained with the online data analysis system against previously published results on these sources. We also discuss the limitations of the online analysis system.
    Conclusions. We consider the INTEGRAL online data analysis a demonstration of a more general web-based 'data analysis as a service' approach, which provides a promising solution for preserving and maintaining the data analysis tools of astronomical telescopes on (multi-)decade timescales and facilitates combining data in multi-wavelength and multi-messenger studies of astronomical sources.
    Key words: methods: data analysis / X-rays: general / methods: observational
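    For reference, a minimal usage sketch of the Python client mentioned in the Methods above. The package (oda_api) and its DispatcherAPI/get_product entry points follow the publicly documented interface, but the exact parameter names and service URL here should be treated as assumptions and checked against the current documentation.

```python
# NOTE: parameter names and URL below are best-effort assumptions;
# consult the oda_api documentation before use.
from oda_api.api import DispatcherAPI

disp = DispatcherAPI(url="https://www.astro.unige.ch/mmoda/dispatch-data")

# Request an ISGRI sky image around the Crab for a given time and energy
# interval; the server reuses cached intermediate products where available.
data = disp.get_product(
    instrument="isgri",
    product="isgri_image",
    T1="2003-03-15T23:27:40.0",
    T2="2003-03-16T00:03:12.0",
    E1_keV=20.0,
    E2_keV=40.0,
    RA=83.63,
    DEC=22.01,
    radius=15.0,
)
print(data)  # a collection of high-level products (images, catalogs, ...)
```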